Fast ABC-Boost for Multi-Class Classification

Author

  • Ping Li
Abstract

Abc-boost is a new line of boosting algorithms for multi-class classification that exploits the commonly used sum-to-zero constraint. To implement abc-boost, a base class must be identified at each boosting step. Prior studies determined the base class with a very expensive exhaustive-search procedure at every boosting step, and reported good test performance of abc-boost (implemented as abc-mart and abc-logitboost) on a variety of datasets. For large datasets, however, the exhaustive-search strategy adopted in prior abc-boost algorithms is prohibitively expensive. To overcome this serious limitation, this paper suggests a heuristic that introduces gaps when computing the base class during training: we update the choice of the base class only once every G boosting steps (prior studies correspond to G = 1). We test this idea on large datasets (Covertype and Poker) as well as datasets of moderate size. Our preliminary results are very encouraging. On the large datasets, when G ≤ 100 (or even larger), there is essentially no loss of test accuracy compared to using G = 1. On the moderate datasets, no obvious loss of test accuracy is observed when G ≤ 20 ∼ 50. Aided by this gap heuristic, abc-boost is therefore promising as a practical tool for accurate multi-class classification.
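To make the gap heuristic concrete, below is a minimal sketch (not the authors' implementation) of a boosting loop that re-selects the base class only every G iterations and otherwise reuses the previous choice. The callables train_step and search_base_class are hypothetical placeholders standing in for one boosting update and for the exhaustive base-class search, respectively.

    # Sketch of the gap heuristic for base-class selection in abc-boost.
    # `search_base_class` and `train_step` are hypothetical placeholders,
    # not the authors' code.
    def abc_boost_with_gaps(train_step, search_base_class, num_steps, num_classes, gap):
        """Run num_steps boosting iterations, re-running the expensive
        base-class search only every `gap` steps (gap = 1 recovers the
        exhaustive scheme used in prior abc-boost studies)."""
        base_class = 0   # any initial choice; refreshed at step 0
        model = []       # list of fitted weak learners
        for step in range(num_steps):
            if step % gap == 0:
                # Expensive exhaustive search over all candidate base classes.
                base_class = search_base_class(model, num_classes)
            # Ordinary boosting update with the (possibly stale) base class fixed.
            model.append(train_step(model, base_class))
        return model

With gap = 1 every step pays the full search cost; with gap = 100 the search cost is amortized over 100 steps, which is the regime the paper reports as essentially lossless on the large datasets.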


Similar articles

Adaptive Base Class Boost for Multi-class Classification

We develop the concept of ABC-Boost (Adaptive Base Class Boost) for multi-class classification and present ABC-MART, a concrete implementation of ABC-Boost. The original MART (Multiple Additive Regression Trees) algorithm has been very successful in large-scale applications. For binary classification, ABC-MART recovers MART. For multi-class classification, ABC-MART considerably improves MART, a...


Robust LogitBoost and Adaptive Base Class (ABC) LogitBoost

Logitboost is an influential boosting algorithm for classification. In this paper, we develop robust logitboost to provide an explicit formulation of tree-split criterion for building weak learners (regression trees) for logitboost. This formulation leads to a numerically stable implementation of logitboost. We then propose abc-logitboost for multi-class classification, by combining robust logi...


ABC-LogitBoost for Multi-class Classification

We develop abc-logitboost, based on the prior work on abc-boost[10] and robust logitboost[11]. Our extensive experiments on a variety of datasets demonstrate the considerable improvement of abc-logitboost over logitboost and abc-mart.


A new approach for comparing the results of applying multi-criteria ABC inventory classification models (case study: Saipa Company)

Various models have been presented by researchers for multi-criteria ABC inventory classification. The differing item classifications produced by these models have created a challenge for researchers. In this paper, integrated techniques are used to compare the results of multi-criteria ABC inventory classification models. Presented model for determining the most appropriate mod...


An integrated model for solving the multiple criteria ABC inventory classification problem

In this paper, we present an integrated version of the Ng model and the Zhou and Fan model [W. L. Ng, A simple classifier for multiple criteria ABC analysis, European Journal of Operational Research, 177 (2007) 344-353; P. Zhou & L. Fan, A note on multi-criteria ABC inventory classification using weighted linear optimization, European Journal of Operational Research, 182 (2007) 1488-1491]. The model that N...



Journal:
  • CoRR

Volume: abs/1006.5051

Publication date: 2010